Distributed Mirror Descent over Directed Graphs

Authors

  • Chenguang Xi
  • Qiong Wu
  • Usman A. Khan
Abstract

In this paper, we propose a Distributed Mirror Descent (DMD) algorithm for constrained convex optimization problems on a (strongly-)connected multi-agent network. We assume that each agent has a private objective function and a constraint set. The proposed DMD algorithm employs a locally designed Bregman distance function at each agent, and thus can be viewed as a generalization of the well-known Distributed Projected Subgradient (DPS) methods, which use identical Euclidean distances at the agents. At each iteration of DMD, each agent optimizes its own objective adjusted with the Bregman distance function while exchanging state information with its neighbors. To further generalize DMD, we consider the case where the agent communication follows a directed graph and it may not be possible to design doubly-stochastic weight matrices. In other words, we restrict the corresponding weight matrices to be row-stochastic instead of doubly-stochastic. We study the convergence of DMD in two cases: (i) when the constraint sets at the agents are the same; and (ii) when the constraint sets at the agents are different. By partially following the spirit of our proof, it can be shown that a class of consensus-based distributed optimization algorithms, restricted to doubly-stochastic matrices, remains convergent with stochastic matrices.
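The abstract outlines one DMD-style iteration: mix neighbors' states through a row-stochastic weight matrix, then take a mirror-descent step driven by a local Bregman distance. The sketch below is only illustrative and is not the paper's exact update rule: it assumes the common constraint set is the probability simplex and that every agent uses the negative-entropy Bregman function, so the mirror step reduces to an exponentiated-gradient update; the function names, graph, and step size are invented for the example.

import numpy as np

# Minimal, illustrative sketch of a consensus + mirror-descent iteration.
# Assumptions (not taken from the paper): the common constraint set is the
# probability simplex and every agent uses negative entropy as its Bregman
# function, so the mirror step becomes an exponentiated-gradient update.

def row_stochastic(adjacency):
    """Normalize a directed adjacency matrix (with self-loops) so each row
    sums to one, i.e., each agent averages the states it receives."""
    A = adjacency.astype(float)
    return A / A.sum(axis=1, keepdims=True)

def dmd_step(X, A, grads, alpha):
    """One iteration: mix neighbors' states with the row-stochastic A, then
    take an entropic mirror-descent step that stays on the simplex.

    X     : (n, d) array; row i is agent i's current estimate
    A     : (n, n) row-stochastic weight matrix
    grads : list of callables; grads[i](x) returns a subgradient of f_i at x
    alpha : step size
    """
    V = A @ X                              # consensus mixing step
    X_new = np.empty_like(X)
    for i in range(X.shape[0]):
        g = grads[i](V[i])
        w = V[i] * np.exp(-alpha * g)      # exponentiated-gradient update
        X_new[i] = w / w.sum()             # renormalize onto the simplex
    return X_new

# Toy usage: three agents on a directed cycle, each with a private quadratic.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 3, 4
    targets = rng.random((n, d))
    grads = [lambda x, t=t: 2.0 * (x - t) for t in targets]
    adjacency = np.array([[1, 1, 0],
                          [0, 1, 1],
                          [1, 0, 1]])      # directed cycle plus self-loops
    A = row_stochastic(adjacency)
    X = np.full((n, d), 1.0 / d)           # start at the center of the simplex
    for _ in range(200):
        X = dmd_step(X, A, grads, alpha=0.05)
    print(X)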

Related Articles

Stochastic Optimization from Distributed, Streaming Data in Rate-limited Networks

Motivated by machine learning applications in networks of sensors, internet-of-things (IoT) devices, and autonomous agents, we propose techniques for distributed stochastic convex learning from high-rate data streams. The setup involves a network of nodes—each one of which has a stream of data arriving at a constant rate—that solve a stochastic convex optimization problem by collaborating with ...

Data-Distributed Weighted Majority and Online Mirror Descent

In this paper, we focus on the question of the extent to which online learning can benefit from distributed computing. We focus on the setting in which N agents online-learn cooperatively, where each agent only has access to its own data. We propose a generic data-distributed online learning meta-algorithm. We then introduce the Distributed Weighted Majority and Distributed Online Mirror Descent ...

Designing Distributed Fixed-Time Consensus Protocols for Linear Multi-Agent Systems Over Directed Graphs

This technical note addresses the distributed fixed-time consensus protocol design problem for multi-agent systems with general linear dynamics over directed communication graphs. By using motion planning approaches, a class of distributed fixed-time consensus algorithms are developed, which rely only on the sampling information at some sampling instants. For linear multi-agent systems, the pro...

Distributed strategies for generating weight-balanced and doubly stochastic digraphs

This paper deals with the design and analysis of dynamical systems on directed graphs (digraphs) that achieve weight-balanced and doubly stochastic assignments. Weight-balanced and doubly stochastic digraphs are two classes of digraphs that play an essential role in a variety of coordination problems, including formation control, agreement, and distributed optimization. We refer to a digraph as...

Training Deep Neural Networks via Optimization Over Graphs

In this work, we propose to train a deep neural network by distributed optimization over a graph. Two nonlinear functions are considered: the rectified linear unit (ReLU) and a linear unit with both lower and upper cutoffs (DCutLU). The problem reformulation over a graph is realized by explicitly representing ReLU or DCutLU using a set of slack variables. We then apply the alternating direction...

Journal:
  • CoRR

Volume: abs/1412.5526   Issue: -

Pages: -

Publication date: 2014